2 research outputs found

    EEG analysis – automatic spike detection

    In the diagnosis and treatment of epilepsy, electroencephalography (EEG) is one of the main tools. However, visual inspection of EEG recordings is very time-consuming. Automatic extraction of important EEG features not only saves neurologists a great deal of time, but also enables a whole new level of EEG analysis through data mining methods. In this work we present and analyse methods to extract two such features: a drowsiness score and centrotemporal spikes. For spike detection, a method based on morphological filters is used. A database design is also proposed to allow easy EEG analysis and to make the data accessible to data mining algorithms developed in the future.
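
    The abstract does not give implementation details, but the morphological-filter approach to spike detection is well established: opening and closing the signal with a structuring element wider than a typical spike yields a smooth background estimate, and thresholding the residual flags sharp transients. Below is a minimal Python sketch of that general idea, not the authors' implementation; the structuring-element width and threshold factor are illustrative assumptions.

        # Minimal sketch of 1-D morphological spike detection.
        # Illustration of the general technique only; struct_ms and k
        # are assumed values, not parameters from the paper.
        import numpy as np
        from scipy.ndimage import grey_opening, grey_closing

        def detect_spikes(eeg, fs, struct_ms=60.0, k=4.0):
            """Return sample indices of candidate spikes in a 1-D EEG trace.

            eeg       : 1-D array, single-channel EEG (microvolts)
            fs        : sampling rate in Hz
            struct_ms : width of the flat structuring element in ms
                        (longer than a typical 20-70 ms spike)
            k         : threshold in robust standard deviations
            """
            size = max(3, int(fs * struct_ms / 1000.0))
            # Opening suppresses upward peaks narrower than the element;
            # closing suppresses downward ones. Their average is a smooth
            # background that preserves slow activity.
            background = 0.5 * (grey_opening(eeg, size=size) +
                                grey_closing(eeg, size=size))
            residual = eeg - background  # sharp transients survive here
            # Robust threshold from the median absolute deviation.
            mad = np.median(np.abs(residual - np.median(residual)))
            thresh = k * 1.4826 * mad
            above = np.abs(residual) > thresh
            # One index per contiguous supra-threshold run (rising edges).
            return np.flatnonzero(np.diff(above.astype(int)) == 1) + 1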

    The Data Quality Monitoring Software for the CMS experiment at the LHC: past, present and future

    The Data Quality Monitoring (DQM) software is a central tool in the CMS experiment. It is used in the following key environments: (i) online, for real-time detector monitoring; (ii) offline, for prompt offline feedback and for the final fine-grained data quality analysis and certification; (iii) validation of all reconstruction software production releases; (iv) validation of Monte Carlo productions. Although the basic structure of the Run1 DQM system remains the same for Run2, the system underwent substantial upgrades in many areas between the two periods, both to adapt to changes in the surrounding infrastructure and to meet the growing needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality. The system must cope with the higher-energy, higher-luminosity proton-proton collision data, as well as with data from various special runs, such as Heavy Ion runs. In this contribution we describe the current DQM software, its structure, and its workflow in the different environments. We then discuss the performance of the DQM system in Run2 and our experience with it. We also cover the main technical challenges encountered during Run2 and the solutions adopted, including the efficient use of memory in multithreaded environments. Finally, we present the prospects for a future DQM upgrade, with emphasis on functionality and long-term robustness for LHC Run3.
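
    The abstract does not detail the quality tests themselves, but a typical building block of histogram-based data quality monitoring is an automated comparison of a run's distribution against a known-good reference. A minimal Python sketch of that generic pattern, using a two-sample Kolmogorov-Smirnov test; the function name and p-value cut are illustrative assumptions, not the CMS DQM implementation.

        # Generic DQM-style certification check: flag a run whose
        # monitored distribution drifts from a reference. Illustrative
        # only; names and the p-value cut are assumptions.
        import numpy as np
        from scipy.stats import ks_2samp

        def certify(run_values, reference_values, p_cut=0.01):
            """Compare two samples of a monitored quantity with a
            two-sample KS test and return (status, p_value)."""
            stat, p = ks_2samp(run_values, reference_values)
            return ("GOOD" if p >= p_cut else "BAD"), p

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            ref = rng.normal(0.0, 1.0, 10_000)   # reference run
            good = rng.normal(0.0, 1.0, 5_000)   # consistent run
            bad = rng.normal(0.3, 1.0, 5_000)    # shifted run, should fail
            print(certify(good, ref))
            print(certify(bad, ref))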
